
    Probabilistic Reconciliation of Count Time Series

    We propose a principled method for the reconciliation of any probabilistic base forecasts. We show how probabilistic reconciliation can be obtained by merging, via Bayes' rule, the information contained in the base forecasts for the bottom and the upper time series. We illustrate our method on a toy hierarchy, showing how our framework allows the probabilistic reconciliation of any base forecast. We perform experiments on the reconciliation of temporal hierarchies of count time series, obtaining major improvements compared to probabilistic reconciliation based on the Gaussian or the truncated Gaussian distribution.
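
    As a rough illustration of the Bayes'-rule idea described above (a minimal sketch, not the authors' algorithm), the snippet below reconciles base forecasts on a toy hierarchy with two bottom series and a total u = b1 + b2: the joint bottom base forecast plays the role of a prior, and the upper base forecast, evaluated at b1 + b2, plays the role of a likelihood. The Poisson base forecasts, their rates, and the grid size are assumptions made purely for the example.

        import numpy as np
        from scipy.stats import poisson

        # Toy hierarchy: total u = b1 + b2, with independent base forecasts.
        # The Poisson form and the rates below are illustrative assumptions.
        b1_rate, b2_rate, u_rate = 3.0, 5.0, 10.0

        # Enumerate a grid of bottom-level counts.
        n_max = 40
        B1, B2 = np.meshgrid(np.arange(n_max + 1), np.arange(n_max + 1), indexing="ij")

        # Prior: joint base forecast of the bottom series (independence assumed).
        prior = poisson.pmf(B1, b1_rate) * poisson.pmf(B2, b2_rate)

        # "Likelihood": base forecast of the upper series, evaluated at b1 + b2.
        likelihood = poisson.pmf(B1 + B2, u_rate)

        # Bayes' rule: reconciled joint distribution over the bottom series.
        posterior = prior * likelihood
        posterior /= posterior.sum()

        # The reconciled forecast of the total follows by aggregation.
        totals = np.arange(2 * n_max + 1)
        total_pmf = np.array([posterior[(B1 + B2) == t].sum() for t in totals])
        print("reconciled mean of the total:", (totals * total_pmf).sum())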

    Quasi-static and low-velocity impact behavior of intraply hybrid flax/basalt composites

    In an attempt to improve the low-velocity impact response of natural fiber composites, a new hybrid intraply woven fabric based on flax and basalt fibers has been used to manufacture laminates with both thermoplastic and thermoset matrices. The matrix type (epoxy, or polypropylene (PP) with or without a maleated coupling agent) significantly affected the absorbed energy and the damage mechanisms. The absorbed energy at perforation for PP-based composites was 90% and 50% higher than that of the epoxy and the compatibilized PP composites, respectively. The hybrid fiber architecture counteracted the influence of the low transverse strength of flax fibers on the impact response, irrespective of the matrix type. In the thermoplastic laminates, matrix plasticization delayed the onset of major damage during impact and allowed a better balance of quasi-static properties, energy absorption, peak force, and perforation energy compared to the epoxy-based composites.

    The Lack of BLR in Low Accretion Rate AGN as Evidence of their Origin in the Accretion Disk

    In this paper we present evidence suggesting that the absence or presence of Hidden Broad Line Regions (HBLRs) in Seyfert 2 galaxies is regulated by the rate at which matter accretes onto a central supermassive black hole, in units of the Eddington rate. We use the intrinsic (i.e. unabsorbed) X-ray luminosities of these sources and their black hole masses (estimated by using the well-known relationship between nuclear mass and bulge luminosity in galaxies) to derive the nuclear accretion rate in Eddington units. We find that virtually all HBLR sources have accretion rates larger than a threshold value of $\dot{m}_{\rm thres} \simeq 10^{-3}$ (in Eddington units), while non-HBLR sources lie at $\dot{m} \lesssim \dot{m}_{\rm thres}$. These data nicely fit the predictions of a model proposed by Nicastro (2000), in which the Broad Line Regions (BLRs) are formed by accretion disk instabilities occurring in the proximity of the critical radius at which the disk changes from gas pressure dominated to radiation pressure dominated. This radius diminishes with decreasing accretion rate; for low enough accretion rates (and therefore luminosities), the critical radius becomes smaller than the innermost stable orbit, and BLRs cannot form. (11 pages, 1 figure; accepted for publication in ApJ.)
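
    For context, the Eddington-scaled accretion rate compared above against the $\dot{m}_{\rm thres} \simeq 10^{-3}$ threshold is obtained by dividing a bolometric luminosity estimate by the Eddington luminosity of the black hole. The sketch below shows that conversion; the bolometric correction applied to the unabsorbed 2-10 keV luminosity and the sample values are assumptions for illustration, not numbers taken from the paper.

        # Illustrative conversion of a black-hole mass and an unabsorbed X-ray
        # luminosity into an Eddington-scaled accretion rate. The bolometric
        # correction and the sample values are assumptions, not the paper's data.

        EDD_PER_SOLAR_MASS = 1.26e38   # Eddington luminosity per solar mass [erg/s]

        def eddington_ratio(l_x, m_bh_solar, k_bol=10.0):
            """Return L_bol / L_Edd for a 2-10 keV luminosity l_x in erg/s."""
            l_bol = k_bol * l_x                      # crude bolometric correction
            l_edd = EDD_PER_SOLAR_MASS * m_bh_solar
            return l_bol / l_edd

        # Hypothetical Seyfert 2: L_X = 1e41 erg/s around a 1e8 M_sun black hole.
        m_dot = eddington_ratio(1e41, 1e8)
        print(m_dot, "HBLR expected" if m_dot > 1e-3 else "non-HBLR expected")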

    A note on imprecise Monte Carlo over credal sets via importance sampling.

    This brief paper is an exploratory investigation of how we can apply sensitivity analysis over importance sampling weights in order to obtain sampling estimates of lower previsions described by a parametric family of distributions. We demonstrate our results on the imprecise Dirichlet model, where we can compare with the analytically exact solution. We discuss the computational limitations of the approach, and propose a simple iterative importance sampling method to overcome these limitations. We find that the proposed method works well, at least in the example studied, and we discuss some further possible extensions.
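
    To make the idea concrete, the sketch below performs a sensitivity analysis over importance-sampling weights for a lower prevision under the imprecise Dirichlet model: one batch of samples from a single proposal is reweighted towards every member of a grid over the credal set, and the minimum of the resulting estimates is reported. The gamble, the hyperparameter s, the grid resolution, and the sample size are assumptions, and this is not the authors' exact (iterative) procedure.

        import numpy as np
        from scipy.stats import dirichlet

        rng = np.random.default_rng(0)

        # A gamble on a three-outcome chance experiment (values are illustrative).
        f = np.array([1.0, -0.5, 2.0])

        # Imprecise Dirichlet model: credal set {Dirichlet(s * t) : t in the simplex},
        # with s = 2 and a coarse grid over t (both are assumptions for the example).
        s = 2.0
        grid = [np.array([a, b, 1.0 - a - b])
                for a in np.linspace(0.05, 0.9, 10)
                for b in np.linspace(0.05, 0.9, 10)
                if a + b <= 0.95]

        # One proposal distribution, reused for every member of the credal set.
        alpha0 = np.ones(3)                              # flat Dirichlet proposal
        theta = dirichlet.rvs(alpha0, size=5000, random_state=rng)
        f_theta = theta @ f                              # expectation of f given the chances theta

        estimates = []
        for t in grid:
            # Importance weights: target Dirichlet(s * t) versus the proposal.
            w = dirichlet.pdf(theta.T, s * t) / dirichlet.pdf(theta.T, alpha0)
            estimates.append(np.sum(w * f_theta) / np.sum(w))  # self-normalised estimate

        # Lower prevision = infimum over the credal set (here, a naive grid minimum).
        print("estimated lower prevision of f:", min(estimates))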

    Occurrence of Intestinal Pseudo-obstruction in a Brainstem Hemorrhage Patient

    Intestinal pseudo-obstruction is massive colonic dilation with signs and symptoms of colonic obstruction, but without a mechanical cause. A 49-year-old female patient complained of nausea, vomiting, and abdominal distension 1 month after a massive brainstem hemorrhage. No improvement was seen with conservative treatments. An extended-length rectal tube was inserted to perform a glycerin enema. In addition, bethanechol (35 mg per day) was administered to stimulate colonic motility. The patient's condition gradually improved over a 2-month period without any surgical intervention. Extended-length rectal tube enemas and bethanechol can be used to improve intestinal pseudo-obstruction in stroke patients.

    Imprecise swing weighting for multi-attribute utility elicitation based on partial preferences.

    We describe a novel approach to multi-attribute utility elicitation which is general enough to cover a wide range of problems, whilst at the same time simple enough to admit reasonably straightforward calculations. We allow both utilities and probabilities to be only partially specified, through bounding, while still assuming marginal utilities to be precise. We derive necessary and sufficient conditions under which our elicitation procedure is consistent. As a special case, we obtain an imprecise generalization of the well-known swing weighting method for eliciting multi-attribute utility functions. An example from ecological risk assessment demonstrates our method.
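
    To give a flavour of what an imprecise generalisation of swing weighting can look like (a simplified sketch, not the authors' procedure), the snippet below bounds each swing weight by an interval and computes the resulting interval of aggregate utilities for one alternative by linear programming over the feasible weight set. The marginal utilities and the weight bounds are invented for the example.

        import numpy as np
        from scipy.optimize import linprog

        # Precise marginal utilities of one alternative on three attributes
        # (illustrative values).
        u = np.array([0.8, 0.4, 0.6])

        # Imprecise swing weights: interval bounds on each attribute weight,
        # as could result from partial preference statements (assumed values).
        bounds = [(0.2, 0.5), (0.1, 0.4), (0.2, 0.6)]

        # The weights must form a convex combination: the w_i sum to 1.
        A_eq, b_eq = np.ones((1, 3)), np.array([1.0])

        # Lower aggregate utility: minimise u . w over the feasible weight set.
        lower = linprog(c=u, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        # Upper aggregate utility: maximise u . w, i.e. minimise -u . w.
        upper = linprog(c=-u, A_eq=A_eq, b_eq=b_eq, bounds=bounds)

        print("aggregate utility interval:", (lower.fun, -upper.fun))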

    HST unveils a compact mildly relativistic Broad Line Region in the candidate true type 2 NGC 3147

    NGC 3147 has been considered the best case of a true type 2 AGN: an unobscured AGN, based on the unabsorbed compact X-ray continuum, which lacks a broad line region (BLR). However, the very low luminosity of NGC 3147 implies a compact BLR, which produces very broad lines that are hard to detect against the dominant background host galaxy. Narrow-slit (0.1"x0.1") HST spectroscopy allowed us to exclude most of the host galaxy light, and revealed an H$\alpha$ line with an extremely broad base (FWZI $\sim 27\,000$ km s$^{-1}$). The line profile shows a steep cutoff blue wing and an extended red wing, which match the signature of a mildly relativistic thin accretion disk line profile. It is indeed well fit with a nearly face-on thin disk, at $i \sim 23^\circ$, with an inner radius at $77 \pm 15\ r_g$, which matches the prediction of $62^{+18}_{-14}\ r_g$ from the $R_{\rm BLR} \sim L^{1/2}$ relation. This result questions the very existence of true type 2 AGN. Moreover, the detection of a thin disk, which extends below $100\ r_g$ in an $L/L_{\rm Edd} \sim 10^{-4}$ system, contradicts the current view of the accretion flow configuration at extremely low accretion rates. (6 pages, 3 figures; accepted for publication in MNRAS Letters.)
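
    For readers unfamiliar with the units, the inner-radius comparison above expresses lengths in gravitational radii, $r_g = GM/c^2$. The short sketch below performs that conversion for a radius given in light-days; the black-hole mass and the radius used are placeholders, not measurements from the paper.

        # Convert a radius in light-days into gravitational radii r_g = G M / c^2.
        # The mass and radius below are placeholders, not values from the paper.
        G = 6.674e-8            # gravitational constant [cm^3 g^-1 s^-2]
        C = 2.998e10            # speed of light [cm/s]
        M_SUN = 1.989e33        # solar mass [g]
        LIGHT_DAY = C * 86400.0 # one light-day [cm]

        def radius_in_rg(r_light_days, m_bh_solar):
            """Express a radius given in light-days in units of r_g = GM/c^2."""
            r_g = G * m_bh_solar * M_SUN / C**2
            return r_light_days * LIGHT_DAY / r_g

        # Hypothetical example: a BLR radius of 1 light-day around a 3e8 M_sun
        # black hole comes out at roughly 60 r_g.
        print(radius_in_rg(1.0, 3e8))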

    Efficient algorithms for checking avoiding sure loss.

    Sets of desirable gambles provide a general representation of uncertainty which can handle partial information in a more robust way than precise probabilities. Here we study the effectiveness of linear programming algorithms for determining whether or not a given set of desirable gambles avoids sure loss (i.e. is consistent). We also suggest improvements to these algorithms specifically for checking avoiding sure loss. By exploiting the structure of the problem, (i) we slightly reduce its dimension, (ii) we propose an extra stopping criterion based on its degenerate structure, and (iii) we show that one can directly calculate feasible starting points in various cases, therefore reducing the effort required in the presolve phase of some of these algorithms. To assess our results, we compare the impact of these improvements on the simplex method and two interior point methods (affine scaling and primal-dual) on randomly generated sets of desirable gambles that either avoid or do not avoid sure loss. We find that the simplex method is outperformed by the primal-dual and affine scaling methods, except for very small problems. We also find that using our starting feasible point and extra stopping criterion considerably improves the performance of the primal-dual and affine scaling methods.
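
    One standard finite-space formulation of this consistency check (possibly differing in detail from the linear programs studied in the paper) is sketched below: a finite set of desirable gambles avoids sure loss if and only if some probability mass function assigns every gamble a non-negative expectation, which is a linear feasibility problem. The example gambles are illustrative, and scipy's default solver is used rather than the specific simplex or interior-point implementations compared in the paper.

        import numpy as np
        from scipy.optimize import linprog

        def avoids_sure_loss(F):
            """Check avoiding sure loss for desirable gambles given as rows of F
            (columns index the outcomes), via one standard LP formulation: the set
            avoids sure loss iff some probability mass function p satisfies F p >= 0."""
            k, n = F.shape
            res = linprog(
                c=np.zeros(n),                     # pure feasibility problem
                A_ub=-F, b_ub=np.zeros(k),         # F p >= 0  rewritten as  -F p <= 0
                A_eq=np.ones((1, n)), b_eq=[1.0],  # p sums to one
                bounds=[(0, None)] * n,            # p is non-negative
            )
            return res.status == 0                 # feasible (0) vs infeasible (2)

        # Gambles on a three-element possibility space (illustrative values).
        print(avoids_sure_loss(np.array([[1.0, -1.0, 0.0],
                                         [-1.0, 0.0, 1.0]])))    # True: consistent
        print(avoids_sure_loss(np.array([[-1.0, -2.0, -0.5]])))  # False: a sure loss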